The Minimum Expectation Selection Problem
Authors
Abstract
We define the min-min expectation selection problem (resp. max-min expectation selection problem) to be that of selecting k out of n given discrete probability distributions, to minimize (resp. maximize) the expectation of the minimum value resulting when independent random variables are drawn from the selected distributions. We assume each distribution has finitely many atoms. Let d be the number of distinct values in the support of the distributions. We show that if d is a constant greater than 2, the min-min expectation problem is NP-complete but admits a fully polynomial time approximation scheme. For d an arbitrary integer, it is NP-hard to approximate the min-min expectation problem with any constant approximation factor. The max-min expectation problem is polynomially solvable for constant d; we leave open its complexity for variable d. We also show similar results for binary selection problems in which we must choose one distribution from each of n pairs of distributions.
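To make the objective concrete, the following is a minimal brute-force sketch (not from the paper): it computes the expectation of the minimum of independent discrete random variables, each given as a list of (value, probability) atoms, and enumerates all k-subsets to find the min-min optimum. The enumeration is exponential in n and is meant only to illustrate the definitions above; the names expected_min and min_min_brute_force are illustrative assumptions.

```python
from itertools import combinations

def expected_min(dists):
    """E[min of independent discrete random variables], each distribution
    given as a list of (value, probability) atoms. Illustrative sketch."""
    values = sorted({v for dist in dists for v, _ in dist})

    def p_ge(dist, v):
        # P(X >= v) for a single distribution.
        return sum(p for x, p in dist if x >= v)

    def min_ge(v):
        # By independence, P(min >= v) is the product over the variables.
        prod = 1.0
        for dist in dists:
            prod *= p_ge(dist, v)
        return prod

    # E[min] = sum_v v * P(min == v), where P(min == v) is the difference
    # of consecutive tail probabilities over the d distinct support values.
    exp = 0.0
    for i, v in enumerate(values):
        p_next = min_ge(values[i + 1]) if i + 1 < len(values) else 0.0
        exp += v * (min_ge(v) - p_next)
    return exp

def min_min_brute_force(dists, k):
    """Exhaustive search over all k-subsets; exponential in n,
    for illustration of the objective only."""
    return min(expected_min(list(sub)) for sub in combinations(dists, k))

# Example: three distributions over the values {0, 1, 2}.
dists = [
    [(0, 0.5), (2, 0.5)],
    [(1, 1.0)],
    [(0, 0.2), (2, 0.8)],
]
print(min_min_brute_force(dists, 2))  # prints 0.5 (picking the first two)
```

The max-min variant would simply replace the outer min with max, selecting the k distributions that maximize the same expectation.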
Similar resources
Estimation of Software Reliability by Sequential Testing with Simulated Annealing of Mean Field Approximation
Various combinatorial optimization and permutation problems can be solved with neural network optimization. The problem of estimating software reliability can be solved by optimizing the failed components to a minimum value. Various solutions to the software reliability estimation problem have been given. These solutions are exact and heuristic, but all the exact approach...
The Tail Mean-Variance Model and Extended Efficient Frontier
In portfolio theory, it is well known that the distributions of stock returns often have non-Gaussian characteristics. Therefore, we need non-symmetric distributions for modeling and accurate analysis of actuarial data. For this purpose and for optimal portfolio selection, we use the Tail Mean-Variance (TMV) model, which focuses on rare risks with high losses that usually occur in the tail of r...
Stochastic complexity and model selection from incomplete data
The principle of minimum description length (MDL) provides an approach for selecting the model class with the smallest stochastic complexity of the data among a set of model classes. However, when only incomplete data are available the stochastic complexity for the complete data cannot be numerically computed. In this paper, this problem is solved by introducing a notion of expected stochastic ...
Measure Selection: Notions of Rationality and Representation Independence
We take another look at the general problem of selecting a preferred probability measure among those that comply with some given constraints. The dominant role that entropy maximization has obtained in this context is questioned by arguing that the minimum information principle on which it is based could be supplanted by an at least as plausible "likelihood of evidence" principle. We then r...
A Gaussian Mixture Model to Detect Clusters Embedded in Feature Subspace
The goal of unsupervised learning, i.e., clustering, is to determine the intrinsic structure of unlabeled data. Feature selection for clustering improves the performance of grouping by removing irrelevant features. Typical feature selection algorithms select a common feature subset for all the clusters. Consequently, clusters embedded in different feature subspaces cannot be identified...
Journal: Random Struct. Algorithms
Volume: 21, Issue: -
Pages: -
Publication date: 2002